Affect differentially modulates brain activation in uni- and multisensory body-voice perception.

Authors

  • Sarah Jessen
  • Sonja A Kotz
Abstract

Emotion perception naturally entails multisensory integration. It is also assumed that multisensory emotion perception is characterized by enhanced activation of brain areas implicated in multisensory integration, such as the superior temporal gyrus and sulcus (STG/STS). However, most previous studies have employed designs and stimuli that preclude other forms of multisensory interaction, such as crossmodal prediction, leaving open the question of whether classical integration is the only relevant process in multisensory emotion perception. Here, we used video clips containing emotional and neutral body and vocal expressions to investigate the role of crossmodal prediction in multisensory emotion perception. While emotional multisensory expressions increased activation in the bilateral fusiform gyrus (FFG), neutral expressions compared to emotional ones enhanced activation in the bilateral middle temporal gyrus (MTG) and posterior STS. Hence, while neutral stimuli activate classical multisensory areas, emotional stimuli invoke areas linked to unisensory visual processing. Emotional stimuli may therefore trigger a prediction of upcoming auditory information based on prior visual information. Such prediction may be stronger for highly salient emotional compared to less salient neutral information. Therefore, we suggest that multisensory emotion perception involves at least two distinct mechanisms: classical multisensory integration, as shown for neutral expressions, and crossmodal prediction, as evident for emotional expressions.


Similar articles

Evidence for Training-Induced Plasticity in Multisensory Brain Structures: An MEG Study

Multisensory learning and the resulting neural brain plasticity have recently become a topic of renewed interest in human cognitive neuroscience. Music notation reading is an ideal stimulus for studying multisensory learning, as it allows studying the integration of visual, auditory, and sensorimotor information processing. The present study aimed at answering whether multisensory learning alters uni-se...


Language/Culture Modulates Brain and Gaze Processes in Audiovisual Speech Perception

Several behavioural studies have shown that the interplay between voice and face information in audiovisual speech perception is not universal. Native English speakers (ESs) are influenced by visual mouth movement to a greater degree than native Japanese speakers (JSs) when listening to speech. However, the biological basis of these group differences is unknown. Here, we demonstrate the time-va...


Cross-cultural differences in the multisensory perception of emotion

Our recent study [1] showed that culture modulates the manner of multisensory integration of affective information. Specifically, Japanese listeners are more attuned to vocal processing than Dutch listeners in the multisensory perception of emotion. The current study aimed to extend those findings by adding an experiment and conducting further analyses on the results of that study. In the experiments, pairs of af...


Perception of facial expressions and voices and of their combination in the human brain.

Using positron emission tomography we explored brain regions activated during the perception of face expressions, emotional voices and combined audio-visual pairs. A convergence region situated in the left lateral temporal cortex was more activated by bimodal stimuli than by either visual only or auditory only stimuli. Separate analyses for the emotions happiness and fear revealed supplementary...


Combined perception of emotion in pictures and musical sounds.

Evaluation of emotional scenes requires the integration of information from different modality channels, most frequently from audition and vision. Neither the psychological nor the neural basis of auditory-visual interactions during the processing of affect is well understood. In this study, possible interactions in affective processing were investigated via event-related potential (ERP) recordings dur...



Journal:
  • Neuropsychologia

Volume 66, Issue -

Pages -

Publication date: 2015